Nonlinear Metric Learning for kNN and SVMs through Geometric Transformations
Abstract
In recent years, research efforts to extend linear metric learning models to handle nonlinear structures have attracted great interest. In this paper, we propose a novel nonlinear solution that uses deformable geometric models to learn spatially varying metrics, and we apply the strategy to boost the performance of both kNN and SVM classifiers. Thin-plate splines (TPS) are chosen as the geometric model due to their remarkable versatility and representation power in accounting for high-order deformations. By transforming the input space through TPS, we can pull same-class neighbors closer while pushing different-class points farther away for kNN, and make the input data points more linearly separable for SVMs. Improvements in kNN classification performance are demonstrated through experiments on synthetic and real-world datasets, with comparisons against several state-of-the-art metric learning solutions. Our SVM-based models also achieve significant improvements over traditional linear and kernel SVMs on the same datasets.
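As a rough illustration of the idea, not the authors' implementation, the sketch below warps 2D points with a thin-plate spline and then runs kNN in the warped space. The control points, coefficients, and toy labels are invented for the example; an actual metric-learning procedure would fit the warp parameters W and A from labeled data rather than fixing them.

```python
# Minimal TPS-warp + kNN sketch (illustrative only, not the paper's method).
import numpy as np
from sklearn.neighbors import KNeighborsClassifier

def tps_kernel(r):
    # U(r) = r^2 log(r), with U(0) defined as 0
    return np.where(r == 0, 0.0, r**2 * np.log(r + 1e-12))

def tps_transform(X, ctrl, W, A):
    """Warp points X (n,2) with a TPS defined by control points ctrl (m,2),
    nonlinear weights W (m,2) and affine part A (3,2)."""
    r = np.linalg.norm(X[:, None, :] - ctrl[None, :, :], axis=-1)   # (n, m)
    nonlinear = tps_kernel(r) @ W                                    # (n, 2)
    affine = np.hstack([np.ones((len(X), 1)), X]) @ A                # (n, 2)
    return affine + nonlinear

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 2))
y = (X[:, 0] * X[:, 1] > 0).astype(int)          # toy nonlinear labels

ctrl = rng.uniform(-2, 2, size=(6, 2))           # illustrative control points
A = np.vstack([np.zeros(2), np.eye(2)])          # identity affine part
W = 0.05 * rng.normal(size=(6, 2))               # small illustrative warp

X_warped = tps_transform(X, ctrl, W, A)          # learning would optimize W and A
print(KNeighborsClassifier(5).fit(X_warped, y).score(X_warped, y))
```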
Similar Resources
Distance Metric Learning for Large Margin Nearest Neighbor Classification
We show how to learn a Mahalanobis distance metric for k-nearest neighbor (kNN) classification by semidefinite programming. The metric is trained with the goal that the k-nearest neighbors always belong to the same class while examples from different classes are separated by a large margin. On seven data sets of varying size and difficulty, we find that metrics trained in this way lead to signif...
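The large-margin objective described in this abstract can be sketched roughly as follows; the actual method optimizes it as a semidefinite program, whereas the function name, toy data, and target-neighbor selection below are illustrative only.

```python
# Rough sketch of an LMNN-style large-margin loss: pull target neighbors
# close, push differently labeled points at least one unit farther away.
import numpy as np

def lmnn_loss(L, X, y, targets, mu=0.5):
    """L: (d, d) linear map; targets[i]: indices of same-class target neighbors of i."""
    Z = X @ L.T
    pull, push = 0.0, 0.0
    for i, neigh in enumerate(targets):
        for j in neigh:
            d_ij = np.sum((Z[i] - Z[j]) ** 2)
            pull += d_ij
            for l in np.where(y != y[i])[0]:          # impostor candidates
                d_il = np.sum((Z[i] - Z[l]) ** 2)
                push += max(0.0, 1.0 + d_ij - d_il)   # hinge with unit margin
    return (1 - mu) * pull + mu * push

rng = np.random.default_rng(0)
X = rng.normal(size=(20, 3))
y = rng.integers(0, 2, size=20)
targets = [[j for j in range(20) if j != i and y[j] == y[i]][:3] for i in range(20)]
print(lmnn_loss(np.eye(3), X, y, targets))            # loss of the identity metric
```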
Adaptive KNN Classification Based on Laplacian Eigenmaps and Kernel Mixtures
K Nearest Neighbor (kNN) is one of the most popular machine learning techniques, but it often fails to work well with an inappropriate choice of distance metric or due to the presence of many irrelevant features. Linear and non-linear feature transformation methods have been applied to extract class-relevant information to improve kNN classification. In this paper, I describe kNN classification ...
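A hedged sketch of this kind of pipeline, assuming scikit-learn's SpectralEmbedding as a stand-in for the Laplacian-eigenmaps step and a toy dataset; note that this embedding is transductive, unlike a method that can map unseen points.

```python
# Embed data with Laplacian eigenmaps (SpectralEmbedding), then classify with kNN.
# Dataset and parameter choices are illustrative, not taken from the paper.
from sklearn.datasets import load_digits
from sklearn.manifold import SpectralEmbedding
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier

X, y = load_digits(return_X_y=True)
Z = SpectralEmbedding(n_components=10, affinity="nearest_neighbors").fit_transform(X)
print(cross_val_score(KNeighborsClassifier(5), Z, y, cv=5).mean())
```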
Large-Margin kNN Classification Using a Deep Encoder Network
kNN is one of the most popular classification methods, but it often fails to work well with an inappropriate choice of distance metric or due to the presence of numerous class-irrelevant features. Linear feature transformation methods have been widely applied to extract class-relevant information to improve kNN classification, but they are very limited in many applications. Kernels have been used to l...
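A minimal sketch of the general idea: learn a nonlinear embedding with a small encoder trained under a triplet margin loss, then apply kNN in the learned space. The architecture, loss, and data below are placeholders, not the paper's actual model.

```python
# Train a tiny encoder so same-class points embed close together, then run kNN.
import numpy as np
import torch
import torch.nn as nn
from sklearn.neighbors import KNeighborsClassifier

rng = np.random.default_rng(0)
X = rng.normal(size=(300, 20)).astype(np.float32)
y = (X[:, 0] * X[:, 1] > 0).astype(int)            # toy nonlinear labels

encoder = nn.Sequential(nn.Linear(20, 64), nn.ReLU(), nn.Linear(64, 8))
loss_fn = nn.TripletMarginLoss(margin=1.0)
opt = torch.optim.Adam(encoder.parameters(), lr=1e-3)

for _ in range(200):
    # sample random (anchor, positive, negative) triplets
    a = rng.integers(0, len(X), size=32)
    p = np.array([rng.choice(np.where(y == y[i])[0]) for i in a])
    n = np.array([rng.choice(np.where(y != y[i])[0]) for i in a])
    za, zp, zn = (encoder(torch.from_numpy(X[idx])) for idx in (a, p, n))
    loss = loss_fn(za, zp, zn)
    opt.zero_grad(); loss.backward(); opt.step()

Z = encoder(torch.from_numpy(X)).detach().numpy()   # kNN in the learned embedding
print(KNeighborsClassifier(5).fit(Z, y).score(Z, y))
```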
An Effective Approach for Robust Metric Learning in the Presence of Label Noise
Many algorithms in machine learning, pattern recognition, and data mining are based on a similarity/distance measure. For example, the kNN classifier and clustering algorithms such as k-means require a similarity/distance function. Also, in Content-Based Information Retrieval (CBIR) systems, we need to rank the retrieved objects based on the similarity to the query. As generic measures such as ...
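For concreteness, here is a tiny example of the kind of parameterized measure such methods learn, a Mahalanobis distance d_M(x, y) = sqrt((x - y)^T M (x - y)); in this sketch M is simply the inverse sample covariance rather than a metric learned from labels.

```python
# Illustrative Mahalanobis distance with a stand-in matrix M (not a learned metric).
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 4))
M = np.linalg.inv(np.cov(X, rowvar=False))        # placeholder for a learned M

def mahalanobis(x, y, M):
    d = x - y
    return np.sqrt(d @ M @ d)

print(mahalanobis(X[0], X[1], M))
```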
Geometric Approach to Statistical Learning Theory through Support Vector Machines (SVM) with Application to Medical Diagnosis
This dissertation deals with problems of Pattern Recognition in the framework of Machine Learning (ML) and, specifically, Statistical Learning Theory (SLT), using Support Vector Machines (SVMs). The focus of this work is on the geometric interpretation of SVMs, which is accomplished through the notion of Reduced Convex Hulls (RCHs), and its impact on the derivation of new, efficient algorithms ...
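The geometric picture can be sketched for the linearly separable case with full (non-reduced) convex hulls: the maximum-margin hyperplane is orthogonal to the segment joining the closest points of the two class hulls. The small quadratic program below, solved with a generic SciPy routine on toy data, illustrates that view; it is not the dissertation's algorithm.

```python
# Find the closest points between the convex hulls of two separable classes.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)
A = rng.normal(size=(30, 2)) + np.array([3.0, 3.0])   # class +1
B = rng.normal(size=(30, 2)) - np.array([3.0, 3.0])   # class -1

def objective(w):
    a, b = w[:len(A)], w[len(A):]
    return np.sum((A.T @ a - B.T @ b) ** 2)           # squared hull-to-hull distance

w0 = np.concatenate([np.full(len(A), 1 / len(A)), np.full(len(B), 1 / len(B))])
cons = [{"type": "eq", "fun": lambda w: np.sum(w[:len(A)]) - 1},
        {"type": "eq", "fun": lambda w: np.sum(w[len(A):]) - 1}]
res = minimize(objective, w0, method="SLSQP",
               bounds=[(0, 1)] * (len(A) + len(B)), constraints=cons)

a, b = res.x[:len(A)], res.x[len(A):]
p, q = A.T @ a, B.T @ b                               # closest points of the two hulls
normal = p - q                                        # normal of the max-margin hyperplane
print(p, q, normal)
```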
Journal: CoRR
Volume: abs/1508.01534
Pages: -
Publication year: 2015